
    First-Order Logic Proofs using Connectionist Constraints Relaxation

    This paper considers the problem of expressing predicate calculus in connectionist networks that are based on energy minimization. Given a first-order-logic knowledge base and a bound k, a symmetric network is constructed (like a Boltzmann machine or a Hopfield network) that searches for a proof for a given query. If a resolution-based proof of length no longer than k exists, then the global minima of the energy function associated with the network represent such proofs. If no proof exists, then the global minima indicate the lack of a proof. The network that is generated is of size polynomial in the bound k and the size of the knowledge base. There are no restrictions on the type of logic formulas that can be represented. An extension enables the mechanism to cope with inconsistency in the knowledge base; i.e., a query is entailed if there exists a proof supporting the query and no better (or equally good) proof exists supporting its negation. Fault tolerance is obtained since symbolic roles are dynamically assigned to units and many units compete for those roles.

    A Fault Tolerant Connectionist Architecture for Construction of Logic Proofs

    This chapter considers the problems of expressing logic and constructing proofs in fault tolerant connectionist networks that are based on energy minimization. Given a first-order-logic knowledge base and a bound k, a symmetric network is constructed (like a Boltzmann machine or a Hopfield network) that searches for a proof for a given query. If a resolution-based proof of length no longer than k exists, then the global minima of the energy function associated with the network represent such proofs. If no proof exists, then the global minima indicate the lack of a proof. The network that is generated is of size polynomial in the bound k and the size of the knowledge base. There are no restrictions on the type of logic formulas that can be represented. Most of the chapter discusses the representation of propositional formulas and proofs; however, an extension is presented that allows the representation of unrestricted first-order logic formulas (predicate calculus). Fault tolerance is obtained using a binding technique that dynamically assigns symbolic roles to winner-takes-all units.
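
    As a purely symbolic illustration of what the global minima in the two abstracts above encode, the sketch below brute-forces a propositional resolution refutation of length at most k. It is a plain search, not the papers' network construction; the clause encoding and helper names (resolve, bounded_refutation) are illustrative assumptions.

        from itertools import combinations

        def resolve(c1, c2):
            """All resolvents of two clauses (clauses are frozensets of int literals)."""
            out = []
            for lit in c1:
                if -lit in c2:
                    out.append(frozenset((c1 - {lit}) | (c2 - {-lit})))
            return out

        def bounded_refutation(kb, query, k):
            """Search for a resolution refutation of kb plus the negated query using
            at most k resolution steps.  Returns the derived clauses of a proof
            (ending in the empty clause), or None if no such short proof exists."""
            clauses = frozenset(kb) | {frozenset({-lit}) for lit in query}

            def search(derived, steps):
                if frozenset() in derived:
                    return []
                if steps == k:
                    return None
                for c1, c2 in combinations(list(derived), 2):
                    for r in resolve(c1, c2):
                        if r in derived:
                            continue
                        rest = search(derived | {r}, steps + 1)
                        if rest is not None:
                            return [r] + rest
                return None

            return search(clauses, 0)

        # Example: from p and (p -> q), the query q has a refutation of length 2.
        P, Q = 1, 2
        kb = [frozenset({P}), frozenset({-P, Q})]
        print(bounded_refutation(kb, frozenset({Q}), k=2))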

    Propositional Non-Monotonic Reasoning and Inconsistency in Symmetric Neural Networks

    We define a notion of reasoning using world-rank-functions, independently of any symbolic language. We then show that every symmetric neural network (like Hopfield networks or Boltzmann machines) can be seen as performing a search for a satisfying model of some knowledge that is wired into the network's topology and weights. Several equivalent languages are then shown to describe symbolically the knowledge embedded in these networks. We extend propositional calculus by augmenting assumptions with penalties. The extended calculus (called penalty logic) is useful in expressing default knowledge, preference between arguments, and reliability of assumptions in an inconsistent knowledge base. Every symmetric network can be described by this language, and any sentence in the language is translatable into such a network. A proof-theoretic reasoning procedure supplements the model-theoretic definitions and gives an intuitive understanding of the non-monotonic behavior of the reasoning mechanism. Finally, we sketch a connectionist inference engine for penalty logic and discuss its capabilities and limitations.
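
    To make "assumptions with penalties" concrete, the following is a minimal formal sketch of the ranking that penalty logic induces over truth assignments. The notation (rho_i for penalties, rank for the world-rank function) and the entailment condition are written here as an illustration and need not match the paper's exact formulation.

        % A penalty knowledge base is a finite set of weighted assumptions
        %   PKB = {(\rho_1, \varphi_1), \ldots, (\rho_n, \varphi_n)},  \rho_i > 0.
        % Its world-rank function charges a truth assignment M the penalties of
        % the assumptions that M violates:
        \mathrm{rank}(M) = \sum_{i \,:\, M \not\models \varphi_i} \rho_i
        % Preferred models are the assignments of minimum rank; a query \psi is
        % (non-monotonically) entailed when it holds in every preferred model:
        \mathrm{PKB} \mathrel{|\!\approx} \psi \iff
          \forall M \, \bigl( \mathrm{rank}(M) = \min_{M'} \mathrm{rank}(M') \Rightarrow M \models \psi \bigr)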

    Symmetric Neural Nets and Propositional Logic Satisfiability

    Connectionist networks with symmetric weights (like Hopfield networks and Boltzmann machines) use gradient descent to find a minimum of quadratic energy functions. We show an equivalence between the problem of satisfiability in propositional calculus and the problem of minimizing those energy functions. The equivalence is in the sense that for any satisfiable Well Formed Formula (WFF) we can find a quadratic function that describes it, such that the set of solutions that minimize the function is equal to the set of truth assignments that satisfy the WFF. We also show that, in the same sense, every quadratic energy function describes some satisfiable WFF. Algorithms are given to transform any propositional WFF into an energy function that describes it and vice versa. High-order models that use Sigma-Pi units are shown to be equivalent to the standard quadratic models with additional hidden units. An algorithm to convert high-order networks into low-order ones is used to implement a satisfiability problem solver on a connectionist network. The results give a better understanding of the role of hidden units and of the limitations and capabilities of symmetric connectionist models. The techniques developed for the satisfiability problem may be applied to a wide range of other problems, such as associative memories, finding maximal consistent subsets, automatic deduction, and even non-monotonic reasoning.
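
    As a concrete instance of the WFF-to-energy direction for formulas in clause form, the sketch below maps a CNF formula to a pseudo-Boolean energy in which each clause contributes a product term equal to 1 exactly when the clause is violated; the zero-energy assignments are then exactly the satisfying ones. Clauses with more than two literals yield terms of order above two, which is where the abstract's hidden-unit reduction would come in (not shown here). The encoding and function names are illustrative assumptions.

        from itertools import product

        def clause_energy(clause, assignment):
            """1 if the clause is violated, else 0.  A clause is a list of int
            literals; a positive literal x contributes the factor (1 - x) and a
            negated literal contributes x, so a k-literal clause yields a
            degree-k monomial."""
            e = 1
            for lit in clause:
                x = assignment[abs(lit) - 1]
                e *= (1 - x) if lit > 0 else x
            return e

        def energy(cnf, assignment):
            """Number of violated clauses; zero iff the assignment satisfies the CNF."""
            return sum(clause_energy(c, assignment) for c in cnf)

        def minima(cnf, n_vars):
            """Brute-force the global minima of the energy over {0,1}^n."""
            table = {a: energy(cnf, a) for a in product((0, 1), repeat=n_vars)}
            best = min(table.values())
            return best, [a for a, e in table.items() if e == best]

        # (x1 or x2) and (~x1 or x3) and (~x2 or ~x3): minimum energy 0, and the
        # minimizing assignments are exactly the satisfying ones.
        print(minima([[1, 2], [-1, 3], [-2, -3]], 3))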

    Converting Binary Thresholds Networks into Equivalent Symmetric Networks

    We give algorithms to convert any network of binary threshold units (that does not oscillate) into an equivalent network with a symmetric weight matrix (like Hopfield networks [Hopfield 82] or Boltzmann machines [Hinton, Sejnowski 88]). The motivation for the transformation is threefold: (1) to demonstrate the expressive power of symmetric networks, i.e., binary threshold networks (that do not oscillate) are subsumed in the energy minimization paradigm; (2) to use network modules (developed for the spreading activation paradigm, for example) within the energy minimization paradigm, so that optimization [Tank, Hopfield 88] and approximation of hard problems can be combined with efficient modules that solve tractable sub-problems; (3) to unify a large class of networks under one paradigm. For acyclic networks we give an algorithm that generates an equivalent symmetric network that is of the same size and performs as efficiently as the original network. For the conversion of recurrent networks we introduce several techniques; however, the generated networks may be larger (in size) than the original.
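
    For the acyclic case, one simplified way to picture the conversion (a sketch under our own assumptions, not the paper's exact algorithm) is to give every unit a quadratic "agreement" term -c * y * (its feed-forward net input), which is realizable with symmetric weights, and to scale the coefficients so that a unit's own term outweighs the feedback arriving through the symmetric copies of its outgoing connections. The brute-force check below confirms that, with hand-picked coefficients, the global energy minimum reproduces the feed-forward computation of a small XOR network.

        from itertools import product

        def step(s):
            return 1 if s > 0 else 0

        # A small acyclic threshold network computing XOR:
        #   h1 = step(x1 + x2 - 0.5)   (OR)
        #   h2 = step(x1 + x2 - 1.5)   (AND)
        #   o  = step(h1 - h2 - 0.5)   (OR and not AND)
        def feedforward(x1, x2):
            h1 = step(x1 + x2 - 0.5)
            h2 = step(x1 + x2 - 1.5)
            return h1, h2, step(h1 - h2 - 0.5)

        # Symmetric-network energy: each unit gets -c * y * (net input).  The
        # coefficients are chosen here by hand so that the hidden units' own terms
        # dominate the feedback from the output term; the paper gives a general
        # scheme for choosing such scalings.
        C_H, C_O = 4.0, 1.0

        def energy(x1, x2, h1, h2, o):
            return (-C_H * h1 * (x1 + x2 - 0.5)
                    - C_H * h2 * (x1 + x2 - 1.5)
                    - C_O * o * (h1 - h2 - 0.5))

        for x1, x2 in product((0, 1), repeat=2):
            best = min(product((0, 1), repeat=3), key=lambda s: energy(x1, x2, *s))
            assert best == feedforward(x1, x2)   # global minimum = feed-forward values
            print((x1, x2), "->", best)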

    Representing and Learning Propositional Logic in Symmetric Connectionist Networks

    The chapter presents methods for efficiently representing logic formulas in connectionist networks that perform energy minimization. Algorithms are given for transforming any formula into a network in linear time and space, and for learning representations of unknown formulas by observing examples of satisfying truth assignments. The relaxation process that underlies energy-minimizing networks reveals an efficient hill-climbing algorithm for satisfiability problems. Experimental results indicate that the parallel implementation of the algorithm gives extremely good average-case performance, even for large-scale, hard, randomly generated satisfiability problems.
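
    The hill-climbing algorithm itself is not spelled out in the abstract; the sketch below is a generic greedy climber on the same energy (the number of violated clauses), with random restarts standing in for stochastic relaxation. The function names, restart scheme, and tie-breaking are our assumptions.

        import random

        def unsat_count(cnf, assign):
            """Energy of an assignment: the number of violated clauses."""
            def sat(clause):
                return any((assign[abs(l)] == 1) == (l > 0) for l in clause)
            return sum(not sat(c) for c in cnf)

        def hill_climb_sat(cnf, n_vars, restarts=20, max_flips=1000, seed=0):
            """Greedy hill climbing on the clause-violation energy.  Repeatedly flips
            the variable whose flip lowers the energy the most, restarting from a
            random assignment at local minima.  Returns a satisfying assignment
            (dict var -> 0/1) or None."""
            rng = random.Random(seed)
            for _ in range(restarts):
                assign = {v: rng.randint(0, 1) for v in range(1, n_vars + 1)}
                for _ in range(max_flips):
                    e = unsat_count(cnf, assign)
                    if e == 0:
                        return assign

                    def flipped_energy(v):
                        assign[v] ^= 1
                        e2 = unsat_count(cnf, assign)
                        assign[v] ^= 1
                        return e2

                    best = min(range(1, n_vars + 1),
                               key=lambda v: (flipped_energy(v), rng.random()))
                    if flipped_energy(best) >= e:
                        break                      # local minimum: restart
                    assign[best] ^= 1
            return None

        # (x1 or ~x2) and (x2 or x3) and (~x1 or ~x3)
        print(hill_climb_sat([[1, -2], [2, 3], [-1, -3]], 3))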

    The equivalence of connectionist energy minimization and propositional calculus satisfiability

    Quadratic energy minimization is the essence of certain connectionist models. We define high-order connectionist models that support the minimization of high-order energy functions, and we prove that high-order energy functions are equivalent to quadratic ones. We show that the standard quadratic models can minimize high-order functions using additional hidden units, and we demonstrate trade-offs between size (number of hidden units), order of the model, and fan-out. We prove an equivalence between the problem of satisfiability in propositional calculus and the problem of minimization of energy functions. An energy function describes a Well Formed Formula (WFF) if the set of solutions to the minimization of the function is equal to the set of models (truth assignments) that satisfy the WFF. We show that every satisfiable WFF is described by some energy function and that every energy function describes some WFF. Algorithms are given to transform any propositional WFF into an energy function that describes it, and vice versa. A connectionist propositional inference engine that features incremental updating of the knowledge can be implemented using these algorithms. The results have applications in reasoning and AI, and also give a better understanding of the limitations and capabilities of connectionist energy minimization models.
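
    One standard way to see the high-order-to-quadratic equivalence on a single term (an illustrative reduction, not necessarily the paper's construction): for c > 0 and binary variables, the cubic term -c*x1*x2*x3 equals the minimum over a hidden unit w in {0,1} of the quadratic term c*w*(2 - x1 - x2 - x3). The sketch below checks this exhaustively.

        from itertools import product

        def cubic(x1, x2, x3, c=1.0):
            return -c * x1 * x2 * x3

        def quadratic_with_hidden(x1, x2, x3, w, c=1.0):
            return c * w * (2 - x1 - x2 - x3)

        for x in product((0, 1), repeat=3):
            lhs = cubic(*x)
            rhs = min(quadratic_with_hidden(*x, w) for w in (0, 1))
            assert lhs == rhs   # minimizing over the hidden unit recovers the cubic term
        print("cubic term reproduced by a quadratic energy with one hidden unit")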

    Representation and Learning of Propositional Knowledge in Symmetric Connectionist Networks

    The goal of this article is to construct a connectionist inference engine that is capable of representing and learning nonmonotonic knowledge. An extended version of propositional calculus is developed and is demonstrated to be useful for nonmonotonic reasoning and for coping with inconsistency that may result from noisy, unreliable sources of knowledge. Formulas of the extended calculus (called penalty logic) are proved to be equivalent in a very strong sense to symmetric networks (like Hopfield networks and Boltzmann machines), and efficient algorithms are given for translating back and forth between the two forms of knowledge representation. The paper presents a fast learning procedure that allows symmetric networks to learn representations of unknown logic formulas by looking at examples. A connectionist inference engine is then sketched whose knowledge is either compiled from a symbolic representation or inductively learned from training examples. Finally, the paper shows that penalty logic can be used as a high-level specification language for connectionist networks and as a framework into which several recent systems may be mapped.
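
    Complementing the earlier ranking definition, the sketch below computes penalty-logic entailment by brute force: rank every truth assignment by the total penalty of the formulas it violates, and require the query to hold in all minimum-rank assignments. Representing formulas as Python predicates, and the bird/penguin example, are illustrative choices rather than the paper's; the connectionist engine is of course not a brute-force enumeration.

        from itertools import product

        def preferred_models(pkb, n_vars):
            """Minimum-rank truth assignments of a penalty knowledge base, found by
            brute force.  pkb is a list of (penalty, formula) pairs, each formula a
            predicate over an assignment tuple."""
            def rank(m):
                return sum(p for p, f in pkb if not f(m))
            worlds = list(product((0, 1), repeat=n_vars))
            best = min(rank(m) for m in worlds)
            return [m for m in worlds if rank(m) == best]

        def entails(pkb, n_vars, query):
            """Non-monotonic entailment: the query holds in every preferred model."""
            return all(query(m) for m in preferred_models(pkb, n_vars))

        # Variables: 0 = bird, 1 = penguin, 2 = flies.
        pkb = [
            (10, lambda m: (not m[1]) or m[0]),      # penguin -> bird    (strong)
            (10, lambda m: (not m[1]) or not m[2]),  # penguin -> ~flies  (strong)
            (1,  lambda m: (not m[0]) or m[2]),      # bird -> flies      (default)
            (10, lambda m: m[0]),                    # observed: bird
        ]
        print(entails(pkb, 3, lambda m: m[2]))       # True: the bird flies by default
        print(entails(pkb + [(10, lambda m: m[1])], 3, lambda m: not m[2]))  # True: the penguin does not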